Computing as an Appliance: Apple's Contribution
or, How Apple Ruined Computing for All Time
As people know by now, I have been calling the PC industry an appliance industry since around 2007. I've tried to justify this position in several ways over the years, with mixed results. Still, it should come as no surprise, considering that the following events all occurred at roughly the same time:
Moore's Law officially ended. You can argue that transistor densities are still increasing, but realistically they are not doubling anywhere near the 18-month cycle popularly attributed to Gordon Moore (his own prediction was closer to every two years), and they haven't been for a long time now. Performance certainly is not doubling. What's happening instead is that vendors are putting more dies onto the same package, stacking them to form 3D circuits, or simply using larger dies. Have you seen how big a contemporary Xeon processor die is? Performance gains are approaching linear rather than exponential growth, and even that only in the best of cases (e.g., server-class Xeon processors).
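To make the exponential-versus-linear gap concrete, here is a minimal sketch in Python. The 18-month doubling period is the figure quoted above; the 15% annual linear gain is purely an assumption chosen to illustrate the shape of the two curves, not a measurement of any real processor line:

```python
# Compare a hypothetical "double every 18 months" curve against a merely
# linear improvement curve over one decade. Both start from a notional 1x
# baseline; the linear gain per year is an assumed, illustrative figure.

DOUBLING_PERIOD_YEARS = 1.5   # the oft-quoted 18-month cadence
LINEAR_GAIN_PER_YEAR = 0.15   # assumed 15% absolute gain per year

for year in range(0, 11, 2):
    exponential = 2 ** (year / DOUBLING_PERIOD_YEARS)
    linear = 1 + LINEAR_GAIN_PER_YEAR * year
    print(f"year {year:2d}: doubling {exponential:7.1f}x   linear {linear:4.2f}x")
```

After ten years the doubling curve predicts roughly a hundredfold improvement, while the linear curve delivers about 2.5x; the latter is far closer to the generation-over-generation gains we've actually been seeing.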
Microsoft and Intel collaborated to produce the latest incarnation of the Trusted Computing Platform, including what would eventually become known as UEFI. While UEFI is an "open" standard and even has source code on GitHub, the license it's under virtually guarantees proprietary implementations. It's even in the FAQ.
Consumers started to realize that Microsoft's old revenue engine (the continued introduction of new glitz over new features, and new features over fixes for systemic and severe bugs, even at the cost of customers' security) resulted in ever-slower uptake of new copies of both Windows and Office. Remember when Windows 8 came out, and how nobody at all wanted to upgrade to it? It was a complete disaster for Microsoft, which ultimately resorted to releasing Windows 10 and then coercing customers to upgrade against their will. This link is just one of a ton of examples; all you gotta do is Google.
As a result, Microsoft needed a platform that it tightly controlled:
Control over the operating systems supported. This gives Microsoft the ability to deny users any freedom of operating system choice, which is why it's so rare to find Microsoft Surface RT tablets with Linux or BSD on them, for instance: the UEFI on those devices just won't allow it. (This seems to have been resolved with Surface Pro tablets, perhaps due to "intense" customer feedback.) And while Microsoft vowed never to curtail the ability to load Linux on a desktop-class PC, they made no such promises for other classes of devices. A short sketch below shows how to check, from Linux, whether a given machine's firmware is enforcing Secure Boot.
Control over the hardware. This is where the Surface tablets come from. Since the PC market is dominated by clones and compatibles of the original IBM PC architecture, Microsoft had no way to impose control over the architecture or over the manufacturing processes used to make PC-compatible motherboards. (They tried once before, in the early 2000s, and it failed spectacularly.) As a result, if you pore through magazine articles from that era, you see article after article bemoaning the death of the PC as mobile device up-take increased. This creates a positive feedback loop, where fewer people buy PCs because they are led to believe that PCs are a lost cause. The PC was, and basically still is, a loss-leader for many companies, so this was the perfect opportunity for them to drive the nails into the PC's coffin. The real money-makers are dedicated appliances.
Given those two points of control, Microsoft controlled every mechanism by which a casual user would put software on the device, guaranteeing itself a revenue stream.
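As an aside on the first point: on a machine where you can still get a mainstream OS running, you can at least inspect whether the firmware is enforcing Secure Boot. Below is a minimal sketch, assuming a Linux system with efivarfs mounted at its usual location; on a locked-down device like the Surface RT you never get far enough to run it, which is rather the point.

```python
# Read the UEFI "SecureBoot" variable via Linux's efivarfs to see whether
# the firmware enforces signature checks on boot loaders. The GUID is the
# standard EFI global-variable namespace; efivarfs prefixes each variable
# with 4 bytes of attribute flags, so the 5th byte holds the actual value
# (1 = Secure Boot enforcing, 0 = not enforcing).

from pathlib import Path

SECURE_BOOT_VAR = Path(
    "/sys/firmware/efi/efivars/SecureBoot-8be4df61-93ca-11d2-aa0d-00e098032b8c"
)

def secure_boot_state():
    """Return True, False, or None (no UEFI, no efivarfs, or no permission)."""
    try:
        data = SECURE_BOOT_VAR.read_bytes()
    except OSError:
        return None
    return bool(data[4]) if len(data) >= 5 else None

if __name__ == "__main__":
    state = secure_boot_state()
    print({True: "Secure Boot: enforcing",
           False: "Secure Boot: not enforcing",
           None: "Secure Boot: state unknown"}[state])
```

The same variable exists on Windows RT hardware, of course; the difference is that Microsoft's certification rules for those ARM devices required Secure Boot to be non-disableable, so the owner could never turn the check off or boot anything unsigned.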
While I can't fault them for wanting to ensure continuing revenue, I can certainly fault them for failing to be very imaginative in addressing customers' needs. Many customers neither wanted nor asked for an appliance form-factor; they wanted enhanced core technology instead. That came eventually (preemptive multitasking arrived in Windows more than 10 years after the Commodore Amiga's introduction to the market, while useful memory protection didn't arrive in Windows until the year 2000!), but there's no denying that the big selling point for Windows and Office in particular, and for many 3rd-party products designed to be compatible with them, was visual glitz and how intuitive the user interface was intended to be (but rarely was). To this day, products are often sold with no meaningful change in utility, merely with different user interface layouts. When was the last time you upgraded Office and realized you could actually use some new feature in Word to your benefit? Every document I write today could still be written in Word for Windows 3.1. Once you have the perfect (or at least a good-enough) product, how else do you compete? Consider the automotive manufacturers.
I'm happy to see others coming to similar conclusions regarding computing-as-an-appliance, and doing so from very different data sets. About a week ago, I saw a toot-stream (sources at the end of this article) which provides a cogent and poignant chronicle of what happened in the industry. With the author's permission, I reproduce their essay below. Please note: I reproduce it as it was typed. I've made no effort to significantly edit the material. Spelling, capitalization, and punctuation errors are left as-is. Mastodon, the social media platform I drew the content from, imposes per-post length limits much like Twitter's (just larger), and much of the content's unconventional structure stems from those limits.
Mona Drafter writes,
Apple ruined computing for all time
before talking about how Apple inflicted a mortal wound on personal computing, I want to talk about something more mundane and familiar: the consumer good. the appliance.
take, for example, the electric stove. I have used probably at least a dozen of these in my life, all provided with the house or apartment, and they have all been the same.
you get some resistive heating coils (which usually don't even support a pot levelly, and which make contact with the pot at only two or three points) controlled by an "open-loop" circuit, i.e. with no feedback for regulation of temperature, just the war between resistive heating and heat loss through radiation and conduction, which varies widely upon conditions, so you can never be assured of even, reproducible heating. it's rubbish.
I do not know exactly how old this design for an electric stove is, but it dates back approximately a century, perhaps more. and it is almost the only model of electric stove available to the general public.
purchasing more expensive models of the electric stove gets you some extra polish. you might get a layer of glass over the heating coils. the panel controls are slicker. overall build quality might be better
but the same stove.
this is the general pattern with consumer goods under capitalism. to get any real difference you need to look outside the consumer market, to far more expensive devices sold to professional cooks. (I'm aware of inductive cooktops but I do not consider these a replacement, for they are not for general purpose heating of cookware.)
what Apple tried to do, what has doomed us today, is to drag personal computers into this dead zone.
I am old enough to remember the previous generation of personal computers, before Apple's Macintosh set the tone for the rest of the century and beyond--machines like the Commodore-64 and the older-generation Apples: they were both instantly usable and instantly programmable.
there was nothing you couldn't do on your C-64 right out of the box. it came with complete technical specifications. AND you could just run apps on it.
Apple, quite deliberately, set out to wreck this. they wished to make a personal computer that was like a "white good", like an electric stove, frozen in place. out of the box, the Macintosh was a completely inflexible machine.
programming the Macintosh originally required the "Macintosh Programmer's Workshop", which I attempted to use once (I have done some System 7 programming though it was a long time ago.)
MPW was a nightmare.
MPW was a hideously expensive, strange software development environment that was a sort of own-brand command-line environment without the advantage of UNIX conciseness, and what seemed like almost wilfully obtuse syntax. (the "curly-d" partial differential symbol was an important token for some reason.)
eventually third-party developers would supply better and more affordable environments but the initial message was clear
by making the only way to program the Macintosh a byzantine horror, and charging hundreds of dollars for it, Apple was making it quite clear that whatever ideas they were purporting to extol about ease of use in personal computing, ideas mostly stolen from Xerox, they did NOT want those ideas to apply to programming. they wanted that to be difficult and arcane
they wanted to freeze computing in place, and dictate its course
it did not have to be this way. ironically, Apple gave us at least an indication of how it could have been different, with HyperCard
here was a development environment that was actually consistent with the Macintosh user experience and not some bizarre command-line monstrosity running parallel to it
but it was rather a toy, albeit a flexible one, and like all of Apple's really good ideas, they abandoned it
Apple's fatal sundering of usability and programmability has informed the entire personal computing world since then, corrupting and ruining it
they froze interface development. everything since has just been copies of copies of copies of that original Xerox work
more importantly, everyone copied how Apple sold and marketed their devices. everyone wanted to make personal computers into mere appliances after Apple.
the Apple philosophy infected the smart phone, and as a consequence smart phones are some of the worst consumer devices imaginable, inflexible and infuriating, loaded with unnecessary software that can't be removed, programmable only with great difficulty. the Android SDK is a deliberate disaster. it's not meant to be usable by the ordinary human.
and casting a pall over the entire industry is a lesson ground into us from childhood now, reinforced by a toxic computer-geek community: This Is The Way It Must Be
nobody can imagine personal computers working any differently now
we're just supposed to live with this wretched state of affairs that Apple and Steve Jobs helped to create, decades ago.
well...I'm dedicated to putting an end to it.
this is not my field. I am a physical scientist, a chemist. I did not want to spend the waning years of my life and my intellect fighting with computers, but I think it's necessary. more than ever
we need to wrest these powerful and wonderful machines away from their current masters and give them back to the people, all people, not just a priesthood of computer geeks
I'd like to thank Mona Drafter for letting me collect their essay and reproduce it in full on my website.
Sources for the individual messages:
- https://icosahedron.website/@mona/100788977959049409
- https://icosahedron.website/@mona/100788985866351133
- https://icosahedron.website/@mona/100788995399048010
- https://icosahedron.website/@mona/100789003903112907
- https://icosahedron.website/@mona/100789012678038262
- https://icosahedron.website/@mona/100789021186845755
- https://icosahedron.website/@mona/100789078098729751
- https://icosahedron.website/@mona/100789132411064551
- https://icosahedron.website/@mona/100789143095697450
- https://icosahedron.website/@mona/100789154274061897
- https://icosahedron.website/@mona/100789169373577878
- https://icosahedron.website/@mona/100789178820995340
- https://icosahedron.website/@mona/100789187281232147